A Second Order Primal-Dual Method for Nonsmooth Convex Composite Optimization

Authors

Abstract

We develop a second order primal-dual method for optimization problems in which the objective function is given by the sum of a strongly convex twice differentiable term and a possibly nondifferentiable convex regularizer. After introducing an auxiliary variable, we utilize the proximal operator of the nonsmooth regularizer to transform the associated augmented Lagrangian into a function that is once, but not twice, continuously differentiable. The saddle point of this function corresponds to the solution of the original problem. We employ a generalization of the Hessian to define second-order updates on this function and prove global exponential stability of the corresponding differential inclusion. Furthermore, we develop a globally convergent customized algorithm that utilizes the primal-dual augmented Lagrangian as a merit function. We show that the search direction can be computed efficiently and establish quadratic/superlinear asymptotic convergence. We use an $\ell_1$-regularized model predictive control problem and the problem of designing a distributed controller for a spatially invariant system to demonstrate the merits and effectiveness of our method.
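The construction above hinges on the proximal operator of the nonsmooth regularizer. As a minimal illustration only — a plain first-order proximal-gradient loop, not the paper's generalized-Hessian second-order scheme — the sketch below shows the $\ell_1$ proximal operator (soft-thresholding) applied to a small least-squares composite problem; the data `A`, `b` and the penalty `gamma` are arbitrary stand-ins.

```python
import numpy as np

def prox_l1(v, t):
    """Proximal operator of t*||.||_1, i.e. elementwise soft-thresholding."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Toy composite problem: minimize 0.5*||A x - b||^2 + gamma*||x||_1
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
gamma = 0.5

# Proximal-gradient iteration (first-order stand-in for illustration;
# the paper instead takes second-order updates on the augmented Lagrangian).
x = np.zeros(5)
step = 1.0 / np.linalg.norm(A.T @ A, 2)   # 1/L, L = Lipschitz constant of the gradient
for _ in range(500):
    grad = A.T @ (A @ x - b)              # gradient of the smooth term
    x = prox_l1(x - step * grad, step * gamma)
```

The soft-thresholding map is cheap and separable, which is what makes proximal reformulations of $\ell_1$-regularized problems attractive in the first place.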


Similar articles


Smooth Primal-Dual Coordinate Descent Algorithms for Nonsmooth Convex Optimization

We propose a new randomized coordinate descent method for a convex optimization template with broad applications. Our analysis relies on a novel combination of four ideas applied to the primal-dual gap function: smoothing, acceleration, homotopy, and coordinate descent with non-uniform sampling. As a result, our method features the first convergence rate guarantees among the coordinate descent ...


A Primal-Dual Splitting Method for Convex Optimization Involving Lipschitzian, Proximable and Linear Composite Terms

We propose a new first-order splitting algorithm for solving jointly the primal and dual formulations of large-scale convex minimization problems involving the sum of a smooth function with Lipschitzian gradient, a nonsmooth proximable function, and linear composite functions. This is a full splitting approach, in the sense that the gradient and the linear operators involved are applied explici...
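As a minimal sketch of a generic primal-dual splitting update of the kind described above (here the Condat–Vũ scheme applied to a toy total-variation-style denoising problem), the snippet below alternates an explicit gradient step on the smooth term with a dual proximal step on the linearly composed term; the operator `D`, data `b`, and step sizes are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Toy instance: minimize 0.5*||x - b||^2 + gamma*||D x||_1,
# with D the first-difference operator (a total-variation-style penalty).
n = 30
rng = np.random.default_rng(1)
b = np.concatenate([np.ones(15), -np.ones(15)]) + 0.1 * rng.standard_normal(n)
D = np.diff(np.eye(n), axis=0)            # (n-1) x n first-difference matrix
gamma = 0.2

tau, sigma = 0.5, 0.25                    # satisfy tau*(Lf/2 + sigma*||D||^2) <= 1
x = b.copy()
y = np.zeros(n - 1)
for _ in range(1000):
    # Primal step: gradient of the smooth term plus the transposed linear map.
    x_new = x - tau * (x - b) - tau * (D.T @ y)
    # Dual step: prox of the conjugate of gamma*||.||_1 is projection onto
    # the infinity-norm ball of radius gamma, i.e. elementwise clipping.
    y = np.clip(y + sigma * (D @ (2 * x_new - x)), -gamma, gamma)
    x = x_new
```

Both the gradient and the linear operator are applied explicitly — no inner subproblem is solved — which is the "full splitting" property the abstract refers to.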


Primal-dual exterior point method for convex optimization

We introduce and study the primal-dual exterior point (PDEP) method for convex optimization problems. The PDEP is based on the Nonlinear Rescaling (NR) multipliers method with dynamic scaling parameters update. The NR method at each step alternates finding the unconstrained minimizer of the Lagrangian for the equivalent problem with both Lagrange multipliers and scaling parameters vectors updat...


Primal-Dual Lagrangian Transformation method for Convex Optimization

A class Ψ of strongly concave and smooth functions ψ : R → R with specific properties is used to transform the terms of the classical Lagrangian associated with the constraints. The transformation is scaled by a positive vector of scaling parameters, one for each constraint. Each step of the Lagrangian Transformation (LT) method alternates uncons...



Journal

Journal title: IEEE Transactions on Automatic Control

Year: 2022

ISSN: 0018-9286, 1558-2523, 2334-3303

DOI: https://doi.org/10.1109/tac.2021.3115449